Alaska State Legislature (2023 - 2024), Gruenberg 120
03/13/2024 01:00 PM House JUDICIARY
Note: the audio and video recordings are distinct records obtained from different sources; as such, there may be key differences between the two. The audio recordings are captured by our records offices as the official record of the meeting and will have more accurate timestamps.
Audio | Topic |
---|---|
Start | |
HB358 | |
HB254 | |
HB278 | |
Adjourn | |
* first hearing in first committee of referral
+ teleconferenced
= bill was previously heard/scheduled
*+ | HB 358 | TELECONFERENCED |
+ | HB 254 | TELECONFERENCED |
*+ | HB 278 | TELECONFERENCED |
HB 358-PROHIBIT AI-ALTERED REPRESENTATIONS

1:05:13 PM

CHAIR VANCE announced that the first order of business would be HOUSE BILL NO. 358, "An Act relating to use of artificial intelligence to create or alter a representation of the voice or likeness of an individual."

1:05:31 PM

REPRESENTATIVE MIKE CRONK, Alaska State Legislature, prime sponsor, presented HB 358. He shared the sponsor statement [included in the committee packet], which read as follows [original punctuation provided]:

In this high-tech world Artificial Intelligence has become a lightning rod for debate. HB 358 is written as simply as possible to address the use of AI to create false identities and cause harm. Literally everything we see and hear now can be called into question. Is it real, is it genuine, or is it AI created or enhanced. Your voice, your image will only be yours and safe from harm if safeguards are put into place.

1:06:20 PM

DAVE STANCLIFF, Staff, Representative Mike Cronk, Alaska State Legislature, on behalf of Representative Cronk, prime sponsor of HB 358, presented the sectional analysis [included in the committee packet], which read as follows [original punctuation provided]:

Section 1. AS 45.50 is amended adding a new section.

Article 6A. Use of Artificial Intelligence to Represent an Individual.

Sec. 45.50.860 Unauthorized representation of voice or likeness.

• Restrictions in the use of AI to create or alter a person's voice or likeness.
• Sets the standard in the restriction of "Intent to cause harm."
• Defines:
1. "Artificial Intelligence"
2. "Individual"
3. "Likeness representation"

1:08:37 PM

MR. STANCLIFF shared several examples of AI-generated impersonations of elected officials. One example involved an impersonation of President Biden that impacted 25,000 people. He said the proposed legislation would convey that the legislature is aware of this problem.

1:10:49 PM

REPRESENTATIVE GRAY informed the committee that, as a podcast host, he would often edit the order of guests' statements for clarity. He sought to verify that the bill would cover only AI-generated material.

MR. STANCLIFF confirmed.

REPRESENTATIVE GRAY opined that rearranging people's words shouldn't be allowed because it would be easy to insert words and reverse the meaning of sentences. He suggested that the bill should be broader in scope.

MR. STANCLIFF said that would make the bill more complicated, and the goal was to keep it narrow and focused. He stressed that the use of this technology in child pornography is rampant. He indicated that the intent to cause harm would be determined by the courts.

1:14:44 PM

REPRESENTATIVE CARPENTER questioned how the constitutional right to free speech applies to the use of AI and how to ensure that AI does not exert undue influence.

MR. STANCLIFF suspected that if the intent of using AI-generated images was to cause harm, that person could be held liable under the proposed legislation.

REPRESENTATIVE CARPENTER sought to confirm that the bill seeks to modify AS 45.50.

MR. STANCLIFF answered yes.

REPRESENTATIVE CARPENTER asked whether there had been any discussion with Legislative Legal Services about the definition of harm.

MR. STANCLIFF stated that the definition was kept intentionally broad.

1:18:09 PM

REPRESENTATIVE SUMNER pointed out that the proposed definition of AI contemplates a machine-based system with varying levels of autonomy. He asked whether the bill sponsor had considered a scenario in which the AI system has complete autonomy, and who would be liable in that case.

MR. STANCLIFF pointed out that AI is widely used for many great purposes. He reiterated that liability would depend on the intent to cause harm.

1:20:44 PM

REPRESENTATIVE GROH asked whether this issue had been addressed in other states.

MR. STANCLIFF shared his belief that at least 35 other states are delving into this in one way or another. In addition, deep fakes are being addressed by national legislation, and technology companies are being asked to be more responsible.

REPRESENTATIVE GROH questioned the origins of the proposed damages of up to $2,000 for a first violation, not to exceed $2,500 for second or subsequent violations.

MR. STANCLIFF said the bill sponsor was open to the committee's suggestions.

REPRESENTATIVE GROH addressed the concept of intent. He pointed out that in most scenarios, both parties typically believe they are doing the right thing. He asked how that kind of thinking played into the intent-to-cause-harm standard.

MR. STANCLIFF commented on the ability to convince a judge and jury that someone was harmed [by the use of AI].

REPRESENTATIVE GROH asked whether [the proposed damages] were enough.

MR. STANCLIFF reiterated his hope that the committee would delve deeper into these issues. He characterized the bill as a starting point.

1:26:30 PM

REPRESENTATIVE CARPENTER clarified that he was supportive of the bill's intent. He asked whether the definition of AI matters if its purpose is to cause harm. In addition, he asked who would be responsible when the AI's response is counter to its intent.

MR. STANCLIFF responded, "If your finger hit the keyboard, it could come back to bite you." He reiterated that if the use of AI is intended to cause harm, bully someone, or destroy a business, the bill would open the door for more efficient and justifiable civil action.

1:30:13 PM

REPRESENTATIVE SUMNER suggested that an AI program with a certain level of autonomy could be [defined as] a person for the purposes of liability under the proposed legislation.

CHAIR VANCE clarified that the bill would create civil liability, not criminal.

REPRESENTATIVE SUMNER pointed out that in terms of civil liability, incarcerating a computer program would not be entirely possible.

1:32:04 PM

REPRESENTATIVE GRAY gave an example involving two candidates similar in race, age, gender, and height. One of the candidates presents himself/herself as 5 inches taller with the use of Photoshop and wins the race. Considering that the taller candidate tends to win in presidential races, he asked whether the use of Photoshop would be considered AI and whether harm was done.

MR. STANCLIFF answered no. He reiterated that the bill pertains to AI that's used to enhance or change an image with the intent to cause damage. In contrast, if AI were used to make one's opponent appear shorter or to alter his/her speech in a strange way, then the bill might apply. He maintained that because it's a civil matter, the courts would need to make a judgment based on intent.

1:34:37 PM

REPRESENTATIVE GRAY considered a scenario in which a radio station unintentionally airs a clip of President Biden that was AI-generated and asked whether the radio station would be held liable.

MR. STANCLIFF suggested that the committee consider indemnifying people who can present an honest answer for airing AI-generated content. He pointed out that AI has already been used in politics. He emphasized that the bill would be a starting point for identity protection.

REPRESENTATIVE GRAY opined that $2,000 is not a deterrent to media companies.

REPRESENTATIVE CARPENTER stated that intent matters. He likened AI generation to an uncontrollable forest fire, indicating that regardless of intent, the person who "pushed the button" should be held responsible.

1:38:57 PM

REPRESENTATIVE C. JOHNSON pointed out that the Federal Communications Commission (FCC) has very stringent guidelines that do not include the use of AI.

REPRESENTATIVE GRAY stated, "One person's spoof is another person's truth," and shared his belief that [the use of AI on the radio] is something to be concerned about.

REPRESENTATIVE GROH suggested that political satire or parody should not be sanctioned, due to First Amendment protections. He emphasized the difficulty of this task given the novelty and importance of the topic.

1:43:37 PM

CHAIR VANCE announced that HB 358 would be held over.
Document Name | Date/Time | Subjects |
---|---|---|
HB 358 - Sponsor Statement.pdf | HFSH 3/25/2024 1:00:00 PM; HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
HB 358 - v.A.pdf | HFSH 3/25/2024 1:00:00 PM; HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 358 |
HB 358 - Sectional Analysis.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
HB 358 - Statement of Zero Fiscal Impact.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
HB 358 - Alaska Broadcasters Association - Support of Policy.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
HB 358 - Backup Document Articles & Research.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
HB 254 - Sponsor Statement.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - v.A.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - Sectional Analysis.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - Slideshow Presentation.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - Letters of Support.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - Top 13 Age Verification APIs in 2023.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 254 - Age Verfication Petition.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM | HB 254 |
HB 278 - Sponsor Statement.pdf | HJUD 3/13/2024 1:00:00 PM; HSTA 4/23/2024 3:00:00 PM | HB 278 |
HB 278 - v.A.pdf | HJUD 3/13/2024 1:00:00 PM; HSTA 4/23/2024 3:00:00 PM | HB 278 |
HB 278 - Sectional Analysis.pdf | HJUD 3/13/2024 1:00:00 PM; HSTA 4/23/2024 3:00:00 PM | HB 278 |
HB 278 - Statement of Zero Fiscal Impact.pdf | HJUD 3/13/2024 1:00:00 PM; HSTA 4/23/2024 3:00:00 PM | HB 278 |
HB 254 - Statement of Zero Fiscal Impact.pdf | HJUD 3/13/2024 1:00:00 PM | HB 254 |